Add `inputs_k` and `inputs_v` args to attention layer #3379

Closed
Conversation
chiamp changed the title from "split inputs_kv arg in attention layer" to "Add `inputs_k` and `inputs_v` args to attention layer" on Sep 28, 2023.
Codecov Report

@@            Coverage Diff             @@
##             main    #3379      +/-   ##
==========================================
+ Coverage   83.60%   83.62%   +0.02%
==========================================
  Files          56       56
  Lines        6746     6767      +21
==========================================
+ Hits         5640     5659      +19
- Misses       1106     1108      +2
chiamp force-pushed the attention branch 8 times, most recently from 2db0753 to dc02493 on October 4, 2023.
cgarciae reviewed on Oct 5, 2023: Otherwise, looks good!
cgarciae approved these changes on Oct 5, 2023.
Closing after this commit landed.
8bitmp3 pushed a commit to 8bitmp3/flax that referenced this pull request on Nov 16, 2023:

-- f6a222c by Marcus Chiam <marcuschiam@google.com>: split inputs_kv arg in attention layer
COPYBARA_INTEGRATE_REVIEW=google#3379 from chiamp:attention f6a222c
PiperOrigin-RevId: 572671273
Currently, the `MultiHeadDotProductAttention` layer's call method signature is `MultiHeadDotProductAttention.__call__(inputs_q, inputs_kv, mask=None, deterministic=None)`. As discussed in #1737, there are cases where passing separate values for the key and the value is desired, which isn't possible with the current API. This PR adds two more arguments, `inputs_k` and `inputs_v`, and sets the call method signature to the following: `MultiHeadDotProductAttention.__call__(inputs_q, inputs_k=None, inputs_v=None, *, inputs_kv=None, mask=None, deterministic=None)`. Note that the `inputs_kv`, `mask` and `deterministic` args are now keyword-only arguments. The new arguments resolve as follows (see the sketch after this list):

- If `inputs_k` and `inputs_v` are `None`, they will both copy the value of `inputs_q` (i.e. self attention).
- If `inputs_v` is `None`, it will copy the value of `inputs_k` (same behavior as the previous API, i.e. `module.apply(inputs_q=query, inputs_k=key_value, ...)` is equivalent to `module.apply(inputs_q=query, inputs_kv=key_value, ...)`).
- If `inputs_kv` is not `None`, both `inputs_k` and `inputs_v` will copy the value of `inputs_kv`.
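For illustration, here is a minimal sketch of those precedence rules as plain Python. The function name and the omission of error handling are mine; this mirrors the rules described above, not the exact Flax internals:

```python
import warnings


def resolve_attention_inputs(inputs_q, inputs_k=None, inputs_v=None, *,
                             inputs_kv=None):
    """Illustrative sketch of the key/value resolution rules (not Flax internals)."""
    if inputs_kv is not None:
        # Deprecated path: inputs_kv supplies both the key and the value.
        warnings.warn(
            "inputs_kv is deprecated; pass inputs_k and inputs_v instead.",
            DeprecationWarning,
        )
        return inputs_kv, inputs_kv
    if inputs_k is None:
        # No key given: fall back to inputs_q (self attention).
        inputs_k = inputs_q
    if inputs_v is None:
        # No value given: fall back to the key (old inputs_kv behavior).
        inputs_v = inputs_k
    return inputs_k, inputs_v
```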
Users can still use `inputs_kv`, but a `DeprecationWarning` will be raised, and `inputs_kv` will be removed in the future.

Since self attention can be done with the new API, the `SelfAttention` layer will also raise a `DeprecationWarning` and will be removed in the future.

Check out #3389 to see examples of how to port your code over to the new API.
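As a quick orientation (the migration guide in #3389 is authoritative), a minimal before/after sketch, assuming the post-PR `flax.linen` API described above; the `num_heads` value and array shapes are arbitrary placeholders:

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

rng = jax.random.PRNGKey(0)
query = jnp.ones((1, 4, 8))      # (batch, q_len, features)
key_value = jnp.ones((1, 6, 8))  # (batch, kv_len, features)

layer = nn.MultiHeadDotProductAttention(num_heads=2)
params = layer.init(rng, query, key_value)

# Old API: still accepted, but now raises a DeprecationWarning.
out_old = layer.apply(params, query, inputs_kv=key_value)

# New API: key and value can be passed separately (here they coincide).
out_new = layer.apply(params, query, inputs_k=key_value, inputs_v=key_value)

# Self attention via the new defaults, replacing the deprecated SelfAttention.
out_self = layer.apply(params, query)
```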